
    Diffusion on an Ising chain with kinks

    We count the number of histories between the two degenerate minimum-energy configurations of the Ising model on a chain, as a function of the length n and the number d of kinks that appear above the critical temperature. This is equivalent to counting permutations of length n that avoid certain subsequences depending on d. We give explicit generating functions and compute the asymptotics. The setting considered plays a role in describing dynamics induced by quantum Hamiltonians with deconfined quasi-particles.
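
    As a rough illustration of the combinatorial statement, the brute-force Python sketch below counts permutations of length n that avoid a given set of patterns, i.e. contain no subsequence order-isomorphic to any pattern. The forbidden subsequences in the paper depend on d and are not given in the abstract, so the pattern 123 used in the example is only a placeholder; avoiding it yields the Catalan numbers.

    from itertools import combinations, permutations

    def contains_pattern(perm, pattern):
        # True if perm has a subsequence order-isomorphic to pattern.
        k = len(pattern)
        for idx in combinations(range(len(perm)), k):
            sub = [perm[i] for i in idx]
            if all((sub[i] < sub[j]) == (pattern[i] < pattern[j])
                   for i in range(k) for j in range(i + 1, k)):
                return True
        return False

    def count_avoiders(n, patterns):
        # Count permutations of 1..n avoiding every pattern in `patterns`.
        return sum(1 for p in permutations(range(1, n + 1))
                   if not any(contains_pattern(p, pat) for pat in patterns))

    print([count_avoiders(n, [(1, 2, 3)]) for n in range(1, 7)])  # [1, 2, 5, 14, 42, 132]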

    Wick's theorem for q-deformed boson operators

    In this paper, combinatorial aspects of normal ordering arbitrary words in the creation and annihilation operators of the q-deformed boson are discussed. In particular, it is shown how, by introducing appropriate q-weights for the associated "Feynman diagrams", the normally ordered form of a general expression in the creation and annihilation operators can be written as a sum over all q-weighted Feynman diagrams, representing Wick's theorem in the present context.
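
    As a minimal sketch of what such a normal-ordering computation looks like, the Python fragment below rewrites words in 'a' (annihilation) and 'A' (creation) using the common q-boson convention a a† − q a† a = 1, i.e. the rule aA → q·Aa + 1; this convention and the string representation are assumptions, not taken from the paper. The q-polynomial coefficients that accumulate play the role of the q-weights of the diagrams.

    import sympy as sp

    q = sp.symbols('q')

    def normal_order(word):
        # Normally order a word over 'a'/'A' with the rule aA -> q*Aa + 1.
        # Returns {normally ordered word: polynomial coefficient in q}.
        terms = {word: sp.Integer(1)}
        done = {}
        while terms:
            w, c = terms.popitem()
            i = w.find('aA')
            if i < 0:  # all creation operators already to the left
                done[w] = done.get(w, 0) + c
                continue
            swapped = w[:i] + 'Aa' + w[i + 2:]   # q-weighted transposition
            dropped = w[:i] + w[i + 2:]          # contraction (identity term)
            terms[swapped] = terms.get(swapped, 0) + q * c
            terms[dropped] = terms.get(dropped, 0) + c
        return done

    for w, c in sorted(normal_order('aaAA').items()):
        print(w or '1', ':', sp.expand(c))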

    Longitudinal LASSO: Jointly Learning Features and Temporal Contingency for Outcome Prediction

    Longitudinal analysis is important in many disciplines, such as the study of behavioral transitions in social science. Only recently has feature selection drawn adequate attention in the context of longitudinal modeling. Standard techniques, such as generalized estimating equations, have been modified to select features by imposing sparsity-inducing regularizers. However, they do not explicitly model how a dependent variable relies on features measured at proximal time points. Recent graphical Granger modeling can select features in lagged time points but ignores the temporal correlations within an individual's repeated measurements. We propose an approach to automatically and simultaneously determine both the relevant features and the relevant temporal points that impact the current outcome of the dependent variable. Meanwhile, the proposed model takes into account the non-i.i.d. nature of the data by estimating the within-individual correlations. This approach decomposes the model parameters into a summation of two components and imposes separate block-wise LASSO penalties on each component when building a linear model in terms of the past τ measurements of the features. One component is used to select features, whereas the other is used to select temporally contingent points. An accelerated gradient descent algorithm is developed to efficiently solve the related optimization problem, with detailed convergence analysis and asymptotic analysis. Computational results on both synthetic and real-world problems demonstrate the superior performance of the proposed approach over existing techniques. (In: Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM.)
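
    The sketch below gives one generic proximal-gradient reading of the decomposition idea, under an assumed least-squares loss: the coefficient matrix over the past τ time points is written as U + V, a row-wise group penalty on U shrinks whole time points, and a column-wise group penalty on V shrinks whole features. It uses plain proximal gradient rather than the paper's accelerated algorithm, and all shapes and names are illustrative.

    import numpy as np

    def block_soft_threshold(W, lam, axis):
        # Group-lasso prox: shrink whole rows (axis=1) or columns (axis=0).
        norms = np.linalg.norm(W, axis=axis, keepdims=True)
        scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
        return W * scale

    def longitudinal_lasso(X, y, lam_time, lam_feat, step=1e-3, iters=2000):
        # X: (n_samples, tau, n_features), y: (n_samples,).
        # Model: y ~ sum_t X[:, t, :] @ (U + V)[t].
        n, tau, p = X.shape
        U = np.zeros((tau, p))  # rows -> temporally contingent points
        V = np.zeros((tau, p))  # columns -> selected features
        for _ in range(iters):
            resid = np.einsum('ntp,tp->n', X, U + V) - y
            grad = np.einsum('n,ntp->tp', resid, X) / n
            U = block_soft_threshold(U - step * grad, step * lam_time, axis=1)
            V = block_soft_threshold(V - step * grad, step * lam_feat, axis=0)
        return U, V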

    Multi-core job submission and grid resource scheduling for ATLAS AthenaMP

    AthenaMP is the multi-process implementation of the ATLAS software framework; it allows the efficient sharing of memory pages between multiple processes of execution. This has now been validated for production and delivers a significant reduction in the overall application memory footprint with negligible CPU overhead. Before AthenaMP can be routinely run on the LHC Computing Grid, it must be determined how the computing resources available to ATLAS can best exploit the notable improvements delivered by switching to this multi-process model. A study into the effectiveness and scalability of AthenaMP in a production environment is presented, and best practices for configuring the main LRMS implementations currently used by grid sites are identified in the context of multi-core scheduling optimisation.
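
    AthenaMP itself is ATLAS-specific, but the copy-on-write page sharing it exploits can be shown generically: after a fork, worker processes read large parent data without duplicating the physical memory pages. The Python sketch below is only an analogy (POSIX-only, since it requires the fork start method), not ATLAS code.

    import numpy as np
    from multiprocessing import get_context

    # Large read-only data initialized once in the parent (standing in for
    # detector geometry or conditions data in the AthenaMP analogy), ~400 MB.
    shared_table = np.arange(50_000_000, dtype=np.float64)

    def worker(bounds):
        # After fork(), the child sees shared_table through copy-on-write
        # pages: nothing is physically copied as long as it only reads.
        lo, hi = bounds
        return float(shared_table[lo:hi].sum())

    if __name__ == '__main__':
        ctx = get_context('fork')  # the fork start method preserves COW sharing
        with ctx.Pool(processes=4) as pool:
            chunks = [(i * 10_000_000, (i + 1) * 10_000_000) for i in range(4)]
            print(sum(pool.map(worker, chunks)))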

    Matrix permanent and quantum entanglement of permutation invariant states

    We point out that a geometric measure of quantum entanglement is related to the matrix permanent when restricted to permutation-invariant states. This connection allows us to interpret the permanent as an angle between vectors. By employing a recently introduced permanent inequality by Carlen, Loss and Lieb, we can prove explicit formulas for the geometric measure of permutation-invariant basis states in a simple way.
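
    For context, the permanent that appears here is classically computed by Ryser's inclusion-exclusion formula in exponential time; the short sketch below is a textbook implementation, not code from the paper.

    from itertools import combinations
    import numpy as np

    def permanent(A):
        # Ryser's formula: perm(A) = (-1)^n * sum over nonempty column
        # subsets S of (-1)^|S| * prod_i sum_{j in S} A[i, j].
        A = np.asarray(A)
        n = A.shape[0]
        total = 0.0
        for r in range(1, n + 1):
            for cols in combinations(range(n), r):
                total += (-1) ** r * np.prod(A[:, cols].sum(axis=1))
        return (-1) ** n * total

    print(permanent(np.ones((3, 3))))  # 6.0 = 3!, permanent of the all-ones matrix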

    Immunity to diphtheria in Siena

    A general algorithm for manipulating non-linear and linear entanglement witnesses by using exact convex optimization

    A generic algorithm is developed that reduces the problem of obtaining linear and nonlinear entanglement witnesses for a given quantum system to a convex optimization problem. This approach is completely general and can be applied to the entanglement detection of any N-partite quantum system. For this purpose, a map from the convex space of separable density matrices to a convex region called the feasible region is defined. Using exact convex optimization, linear entanglement witnesses can be obtained from polygonal feasible regions, while for curved feasible regions the envelope of the family of linear entanglement witnesses can be taken as a nonlinear entanglement witness. This method provides a new methodological framework within which most previously known EWs can be studied. To demonstrate the capability of the proposed approach, nonlinear witnesses are provided for the entanglement detection of density matrices in unextendible product bases, W states, and mixtures of GHZ and W states, together with further examples of three-qubit systems, their classification, and their entanglement detection. It is also explained how most of the non-decomposable linear and nonlinear three-qubit entanglement witnesses appearing in previous papers by the authors and others can be manipulated by the method proposed here. Keywords: nonlinear and linear entanglement witnesses, convex optimization. PACS number(s): 03.67.Mn, 03.65.Ud
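
    As a loose numerical illustration of the feasible-region idea (not the paper's construction), one can sample pure product states, map them to expectation values of a few fixed observables, and read candidate linear witnesses off the facets of the convex hull of the sampled region; the two-qubit observables below are arbitrary choices made for the example.

    import numpy as np
    from scipy.spatial import ConvexHull

    sz = np.diag([1.0, -1.0])
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    O1, O2 = np.kron(sz, sz), np.kron(sx, sx)

    def random_product_state(rng):
        a = rng.normal(size=2) + 1j * rng.normal(size=2)
        b = rng.normal(size=2) + 1j * rng.normal(size=2)
        return np.kron(a / np.linalg.norm(a), b / np.linalg.norm(b))

    rng = np.random.default_rng(0)
    pts = np.array([[np.real(psi.conj() @ O1 @ psi),
                     np.real(psi.conj() @ O2 @ psi)]
                    for psi in (random_product_state(rng) for _ in range(5000))])

    # Each hull facet (c1, c2, -b) gives a candidate linear witness
    # c1*<O1> + c2*<O2> <= b satisfied by all sampled separable points.
    hull = ConvexHull(pts)
    print(hull.equations[:3])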

    Equitability revisited: why the “equitable threat score” is not equitable

    In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements exactly and satisfies the first only in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and falls below 0.01 only when the number of samples exceeds around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1,000, random but unbiased forecasting systems yield an expected score of around −0.5, whose magnitude falls to 0.01 or below only for sample sizes exceeding 25,000. This presents a problem, since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and that the second can be safely discarded without losing the key advantages of equitability embodied in the first. This enables inequitable and asymptotically equitable measures to be rescaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
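
    The small-sample inequity of ETS claimed above is easy to reproduce numerically; the Monte Carlo sketch below (not from the paper) estimates the expected ETS of an unbiased random forecaster from simulated 2x2 contingency tables.

    import numpy as np

    def ets(a, b, c, d):
        # Equitable threat score from a 2x2 table:
        # a=hits, b=false alarms, c=misses, d=correct negatives.
        a_random = (a + b) * (a + c) / (a + b + c + d)  # hits expected by chance
        return (a - a_random) / (a + b + c - a_random)

    def expected_ets_random(n_samples, base_rate, trials=20000, seed=0):
        rng = np.random.default_rng(seed)
        scores = []
        for _ in range(trials):
            obs = rng.random(n_samples) < base_rate
            fcst = rng.random(n_samples) < base_rate  # random, unbiased forecasts
            a, b = np.sum(fcst & obs), np.sum(fcst & ~obs)
            c, d = np.sum(~fcst & obs), np.sum(~fcst & ~obs)
            if a + b + c > 0:  # skip degenerate tables where ETS is undefined
                scores.append(ets(a, b, c, d))
        return float(np.mean(scores))

    print(expected_ets_random(n_samples=30, base_rate=0.5))    # small but positive
    print(expected_ets_random(n_samples=3000, base_rate=0.5))  # approaches zero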